Kernel conjugate gradient methods with random projections

Authors

Abstract

We propose and study kernel conjugate gradient methods (KCGM) with random projections for least-squares regression over a separable Hilbert space. Considering two types of random projections generated by randomized sketches and Nyström subsampling, we prove optimal statistical results with respect to variants of norms for the algorithms under a suitable stopping rule. In particular, our results show that if the projection dimension is proportional to the effective dimension of the problem, KCGM with random projections can generalize optimally while achieving a computational advantage. As a corollary, we derive optimal rates for classic KCGM in well-conditioned regimes for the case that the target function may not be in the hypothesis space.
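To make the setup concrete, the following is a minimal, hypothetical sketch of the idea described in the abstract: kernel least-squares regression where the problem is first projected onto a small set of Nyström landmark columns and then solved by conjugate gradient with early stopping acting as regularization. The function names, the Gaussian kernel choice, and all parameter values are illustrative assumptions, not the paper's actual algorithm or constants.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kcgm_nystrom(X, y, m=20, t_max=10, sigma=1.0, rng=None):
    """Illustrative kernel CG least squares with Nystrom subsampling.

    Projects the n-dimensional kernel problem onto m randomly chosen
    landmark columns, then runs at most t_max conjugate-gradient
    iterations on the resulting m x m normal equations; the iteration
    count t_max plays the role of the stopping rule (early stopping
    as regularization).
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # Nystrom landmarks
    Knm = gaussian_kernel(X, X[idx], sigma)     # n x m sketch of K
    # Projected normal equations: (Knm^T Knm) b = Knm^T y
    A = Knm.T @ Knm
    rhs = Knm.T @ y
    b = np.zeros(m)
    r = rhs - A @ b
    p = r.copy()
    for _ in range(t_max):
        rr = r @ r
        if rr < 1e-20:          # residual vanished: stop early
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)
        b += alpha * p
        r = r - alpha * Ap
        p = r + (r @ r) / rr * p
    # Predictor uses only the m landmarks: f(x) = sum_j b_j k(x, x_j)
    return lambda Xnew: gaussian_kernel(Xnew, X[idx], sigma) @ b
```

Each CG iteration here costs O(m^2) instead of the O(n^2) of classic KCGM on the full kernel matrix, which is the computational advantage the abstract refers to when m scales with the effective dimension rather than with n.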



Similar articles

Kernel Conjugate Gradient

We propose a novel variant of conjugate gradient based on the Reproducing Kernel Hilbert Space (RKHS) inner product. An analysis of the algorithm suggests it enjoys better performance properties than standard iterative methods when applied to learning kernel machines. Experimental results for both classification and regression bear out the theoretical implications. We further address the domina...


Kernel Conjugate Gradient for Fast Kernel Machines

We propose a novel variant of the conjugate gradient algorithm, Kernel Conjugate Gradient (KCG), designed to speed up learning for kernel machines with differentiable loss functions. This approach leads to a better conditioned optimization problem during learning. We establish an upper bound on the number of iterations for KCG that indicates it should require less than the square root of the nu...


Kernel Conjugate Gradient is Universally Consistent

We study the statistical consistency of conjugate gradient applied to a bounded regression learning problem seen as an inverse problem defined in a reproducing kernel Hilbert space. This approach leads to an estimator that stands out of the well-known classical approaches, as it is not defined as the solution of a global cost minimization procedure over a fixed model nor is it a linear estimato...


Application of frames in Chebyshev and conjugate gradient methods

Given a frame of a separable Hilbert space $H$, we present some iterative methods for solving an operator equation $Lu=f$, where $L$ is a bounded, invertible and symmetric operator on $H$. We present some algorithms based on the knowledge of frame bounds, the Chebyshev method and the conjugate gradient method, in order to give some approximated solutions to the problem. Then we i...
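For reference, the conjugate gradient method that the abstract above builds on can be sketched as follows for a finite-dimensional stand-in of the operator equation $Lu=f$. This is a generic textbook CG sketch under the assumption that $L$ is symmetric positive definite, not the frame-based variants the paper develops; all names are illustrative.

```python
import numpy as np

def conjugate_gradient(L, f, tol=1e-10, max_iter=None):
    # Plain CG for Lu = f with L a symmetric positive-definite
    # matrix (a finite-dimensional stand-in for the bounded,
    # invertible, symmetric operator on H).
    n = f.shape[0]
    max_iter = max_iter or n
    u = np.zeros(n)
    r = f - L @ u          # initial residual
    p = r.copy()           # initial search direction
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Lp = L @ p
        a = rr / (p @ Lp)          # exact line search along p
        u += a * p
        r -= a * Lp
        p = r + (r @ r) / rr * p   # L-conjugate direction update
    return u
```

Chebyshev iteration, by contrast, replaces the residual-dependent step sizes with coefficients derived from known bounds on the spectrum of $L$, which is where knowledge of the frame bounds enters in the paper's setting.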



Journal

Journal title: Applied and Computational Harmonic Analysis

Year: 2021

ISSN: 1096-603X, 1063-5203

DOI: https://doi.org/10.1016/j.acha.2021.05.004